295 research outputs found

    Comment: Classifier Technology and the Illusion of Progress

    Full text link
    Comment on Classifier Technology and the Illusion of Progress [math.ST/0606441]. Published at http://dx.doi.org/10.1214/088342306000000024 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).

    Comment: Classifier Technology and the Illusion of Progress

    Full text link
    Comment on Classifier Technology and the Illusion of Progress [math.ST/0606441]. Published at http://dx.doi.org/10.1214/088342306000000042 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).

    Regularized Discriminant Analysis

    Get PDF

    Regularization Paths for Generalized Linear Models via Coordinate Descent

    Get PDF
    We develop fast algorithms for estimation of generalized linear models with convex penalties. The models include linear regression, two-class logistic regression, and multinomial regression problems, while the penalties include ℓ_1 (the lasso), ℓ_2 (ridge regression) and mixtures of the two (the elastic net). The algorithms use cyclical coordinate descent, computed along a regularization path. The methods can handle large problems and can also deal efficiently with sparse features. In comparative timings we find that the new algorithms are considerably faster than competing methods.
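
    The core of these algorithms is a one-dimensional soft-thresholding update applied cyclically to each coefficient. Below is a minimal sketch of that update for the linear-regression case with standardized predictors; the names `soft_threshold` and `enet_coordinate_descent` are illustrative, not the glmnet API, and the sketch omits the convergence checks and screening rules a production solver would need.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: sign(z) * max(|z| - gamma, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def enet_coordinate_descent(X, y, lam, alpha=1.0, n_iter=100, beta=None):
    """Cyclical coordinate descent for the elastic-net objective
    (1/2n)||y - X beta||^2 + lam*(alpha*||beta||_1 + (1-alpha)/2*||beta||_2^2).

    Assumes the columns of X are standardized so that (1/n) x_j'x_j = 1.
    """
    n, p = X.shape
    if beta is None:
        beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: y minus the fit from all other coefficients.
            r = y - X @ beta + X[:, j] * beta[j]
            z = (X[:, j] @ r) / n
            # Soft-threshold for the l1 part, shrink for the l2 part.
            beta[j] = soft_threshold(z, lam * alpha) / (1.0 + lam * (1.0 - alpha))
    return beta
```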

    Regularization Paths for Cox's Proportional Hazards Model via Coordinate Descent

    Get PDF
    We introduce a pathwise algorithm for the Cox proportional hazards model, regularized by convex combinations of ℓ_1 and ℓ_2 penalties (elastic net). Our algorithm fits via cyclical coordinate descent, and employs warm starts to find a solution along a regularization path. We demonstrate the efficacy of our algorithm on real and simulated data sets, and find considerable speedup over competing methods.
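
    The warm-start idea is generic: solve on a decreasing grid of λ values, initializing each fit from the previous solution so that only a few coordinate sweeps are needed per grid point. The sketch below illustrates this pathwise loop for the linear-model case (the Cox case additionally requires a quadratic approximation to the log-partial-likelihood at each step); `solver` is assumed to have the signature of the hypothetical `enet_coordinate_descent` sketched above.

```python
import numpy as np

def regularization_path(X, y, solver, n_lambda=100, eps=1e-3, alpha=1.0):
    """Solve the elastic net on a log-spaced grid of lambdas, warm-starting
    each fit from the solution at the previous (larger) lambda.

    `solver` is assumed to take (X, y, lam, alpha=..., beta=...) and return
    the fitted coefficient vector (illustrative signature).
    """
    n, p = X.shape
    # Smallest lambda at which all coefficients are zero (linear-model case);
    # the max() guards against division by zero for a pure ridge penalty.
    lam_max = np.max(np.abs(X.T @ y)) / (n * max(alpha, 1e-3))
    lambdas = np.logspace(np.log10(lam_max), np.log10(lam_max * eps), n_lambda)
    beta = np.zeros(p)
    path = []
    for lam in lambdas:
        # Warm start: a few sweeps suffice near the previous solution.
        beta = solver(X, y, lam, alpha=alpha, beta=beta.copy())
        path.append(beta.copy())
    return lambdas, np.array(path)
```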

    Projection Pursuit Regression

    Get PDF

    On bagging and nonlinear estimation

    Get PDF
    We propose an elementary model for the way in which stochastic perturbations of a statistical objective function, such as a negative log-likelihood, produce excessive nonlinear variation of the resulting estimator. Theory for the model is transparently simple, and is used to provide new insight into the main factors that affect performance of bagging. In particular, it is shown that if the perturbations are sufficiently symmetric then bagging will not significantly increase bias; and if the perturbations also offer opportunities for cancellation then bagging will reduce variance. For the first property it is sufficient that the third derivative of a perturbation vanish locally, and for the second, that second and fourth derivatives have opposite signs. Functions that satisfy these conditions resemble sinusoids. Therefore, our results imply that bagging will reduce the nonlinear variation, as measured by either variance or mean-squared error, produced in an estimator by sinusoid-like, stochastic perturbations of the objective function. Analysis of our simple model also suggests relationships between the results obtained using different with-replacement and without-replacement bagging schemes. We simulate regression trees in settings that are far more complex than those explicitly addressed by the model, and find that these relationships are generally borne out.
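
    As a concrete illustration of the two resampling schemes the abstract compares, the hedged sketch below bags a regression tree under with-replacement (bootstrap) and without-replacement (subsample) resampling; the half-sample default for the latter is an illustrative choice, not one prescribed by the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def bagged_predict(X, y, X_test, n_bags=200, replace=True, frac=0.5, seed=0):
    """Average regression-tree predictions over resampled training sets.

    replace=True  -> classical with-replacement (bootstrap) bagging;
    replace=False -> without-replacement subsampling, with `frac`
                     controlling the subsample size (illustrative default).
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    m = n if replace else int(frac * n)
    preds = np.zeros((n_bags, len(X_test)))
    for b in range(n_bags):
        idx = rng.choice(n, size=m, replace=replace)
        tree = DecisionTreeRegressor().fit(X[idx], y[idx])
        preds[b] = tree.predict(X_test)
    # The bagged estimator averages out resampling-induced fluctuations.
    return preds.mean(axis=0)
```

    Averaging over many resampled fits is what cancels the perturbation-induced variation; comparing the two schemes amounts to calling `bagged_predict` with `replace=True` and `replace=False` on the same data.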
    • …